On Nesterov's smooth Chebyshev-Rosenbrock function

Author

  • Florian Jarre
Abstract

We discuss a modification of the chained Rosenbrock function introduced by Nesterov. This function r_N is a polynomial of degree four defined for x ∈ R^n. Its only stationary point is the global minimizer x* = (1, 1, . . . , 1) with optimal value zero. A point x̄ in the box B := {x | −1 ≤ x_i ≤ 1 for 1 ≤ i ≤ n} with r_N(x̄) = 1 is given such that there is a continuous descent path within B that starts at x̄ and leads to x*. It is shown that any continuous piecewise linear descent path starting at x̄ consists of at least 0.72 · 1.618^n linear segments, an exponential number, before reducing the value of r_N to 0.25. Moreover, there exists a uniform bound, independent of n, on the Lipschitz constant of the second derivative of r_N within B.
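For orientation, here is a minimal Python sketch of the smooth Chebyshev–Rosenbrock function as it is commonly stated in the literature; the abstract does not give the exact definition, so the scaling below is an assumption. Under this form, x̄ = (−1, 1, . . . , 1) lies in B and has value 1, consistent with the starting point described above.

```python
import numpy as np

def r_N(x):
    """Smooth Chebyshev-Rosenbrock function (common form in the literature;
    the exact scaling used in the paper is an assumption here):

        r_N(x) = (x_1 - 1)^2 / 4 + sum_{i=1}^{n-1} (x_{i+1} - 2*x_i^2 + 1)^2
    """
    x = np.asarray(x, dtype=float)
    return 0.25 * (x[0] - 1.0) ** 2 + np.sum((x[1:] - 2.0 * x[:-1] ** 2 + 1.0) ** 2)

n = 8
print(r_N(np.ones(n)))                            # 0.0 at the minimizer x* = (1, ..., 1)
x_bar = np.concatenate(([-1.0], np.ones(n - 1)))  # a point in B with value 1 under this form
print(r_N(x_bar))                                 # 1.0
```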


Similar articles

A note about the complexity of minimizing Nesterov's smooth Chebyshev-Rosenbrock function

This short note considers and resolves the apparent contradiction between known worst-case complexity results for first- and second-order methods for solving unconstrained smooth nonconvex optimization problems and a recent note by Jarre (2011) implying a very large lower bound on the number of iterations required to reach the solution's neighbourhood for a specific problem with variable dimension.


On Nesterov’s nonsmooth Chebyshev–Rosenbrock functions

We discuss two nonsmooth functions on R^n introduced by Nesterov. We show that the first variant is partly smooth in the sense of Lewis and that its only stationary point is the global minimizer. In contrast, we show that the second variant has 2^(n−1) Clarke stationary points, none of them local minimizers except the global minimizer, but also that its only Mordukhovich stationary point is the global minimizer.
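For comparison with the smooth case above, a brief sketch of the two nonsmooth variants as they usually appear in the literature; the exact forms used in that paper are assumptions here.

```python
import numpy as np

def nonsmooth_variant_1(x):
    # First variant (assumed form): |x_1 - 1| / 4 + sum |x_{i+1} - 2*x_i^2 + 1|;
    # reported to be partly smooth, with the global minimizer as its only stationary point.
    x = np.asarray(x, dtype=float)
    return 0.25 * np.abs(x[0] - 1.0) + np.sum(np.abs(x[1:] - 2.0 * x[:-1] ** 2 + 1.0))

def nonsmooth_variant_2(x):
    # Second variant (assumed form): |x_1 - 1| / 4 + sum |x_{i+1} - 2*|x_i| + 1|;
    # this is the variant reported to have 2^(n-1) Clarke stationary points.
    x = np.asarray(x, dtype=float)
    return 0.25 * np.abs(x[0] - 1.0) + np.sum(np.abs(x[1:] - 2.0 * np.abs(x[:-1]) + 1.0))

# Both variants vanish at the global minimizer x* = (1, ..., 1):
n = 6
print(nonsmooth_variant_1(np.ones(n)), nonsmooth_variant_2(np.ones(n)))  # 0.0 0.0
```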


A geometric alternative to Nesterov's accelerated gradient descent

We propose a new method for unconstrained optimization of a smooth and strongly convex function, which attains the optimal rate of convergence of Nesterov’s accelerated gradient descent. The new algorithm has a simple geometric interpretation, loosely inspired by the ellipsoid method. We provide some numerical evidence that the new method can be superior to Nesterov’s accelerated gradient descent.


Optimized first-order methods for smooth convex minimization

We introduce new optimized first-order methods for smooth unconstrained convex minimization. Drori and Teboulle [5] recently described a numerical method for computing the N-iteration optimal step coefficients in a class of first-order algorithms that includes gradient methods, heavy-ball methods [15], and Nesterov's fast gradient methods [10,12]. However, the numerical method in [5] is computationally expensive.


Differentiability of Distance Functions and a Proximinal Property Inducing Convexity

In a normed linear space X, consider a nonempty closed set K which has the property that for some r > 0 there exists a set of points x_0 ∈ X\K with d(x_0, K) > r which have closest points p(x_0) ∈ K, and where the set of points x_0 − r(x_0 − p(x_0))/‖x_0 − p(x_0)‖ is dense in X\K. If the norm has sufficiently strong differentiability properties, then the distance function d generated by K has similar differentiability properties.



Journal:
  • Optimization Methods and Software

Volume: 28  Issue: -

Pages: -

Publication date: 2013